perm filename COMPUT.NS[F81,JMC] blob sn#632508 filedate 1981-12-26 generic text, type T, neo UTF8
n529  0103  26 Dec 81
BC-COMPUTE-1stadd-12-26
    X X X TO GRASP THEM.
    But some theologians and philosophers believe that accelerating
technology also possesses a dark side, and that these negative
aspects may one day pose a grave threat to what they have
traditionally described as ''the creative, human spirit.''
    These worried thinkers fear the possibility that mankind,
increasingly distracted by the incredibly complex machines that it
can now build, might ultimately enslave itself in a world of
technological abstractions - inside a ''technocratic bureaucracy''
from which there would be no escape.
    In such a ''1984 world,'' of course, human identities would be
reduced to statistics; machine logic would supplant human intuition
and imagination in the conduct of human affairs; human consciousness
would be irretrievably ''split,'' with rational, abstract thinking
reigning supreme over other forms of more creative, spontaneous and
emotion-centered thought.
    This is an extremely large, extremely complicated issue, of course;
more than one thinker has suggested that its roots reach all the way
back to a ''fundamental error'' in the development of Western thought
- specifically, back to Plato's crucial distinction between the
material world of objects and the immaterial world of eternal
''Ideas.''
    It was the introduction of this perceptual ''split'' between the
physical and mental worlds, they contend, which in part inspired the
great 17th century French philosopher Rene Descartes to pose his
famous ''Cogito, Ergo Sum'' (''I think, therefore I am''), and thus -
by separating the thinking ''subject'' from the supposedly external
''object'' which it perceives - to emphasize the abstracting,
deductive-inductive reasoning process over other forms of thought. In
this way, they conclude, logical, Cartesian rationality permitted
human beings first to begin manipulating nature with their new
''science'' - and then to create the accelerating juggernaut of
industrial technology with which all Western societies must now
contend.
    The disturbing possibility that the enormous complexities presented
by modern technology must finally overwhelm any effort to analyze
them can be seen in a well-known remark by the great modern German
existentialist philosopher Martin Heidegger, who in a moment of
gloomy pessimism suggested that ''philosophy died with the advent of
technology.''
    Theologian Dr. Kalter shares some, but not all, of Heidegger's
pessimism; when he thinks about computers and what they
represent, he sometimes speaks in terms of an approaching
''apocalyptic battle'' between the forces of poiesis, or intuitive
imagination, and the forces of techne, or logical rationality - a
battle in which the stakes will be nothing less than ''to finally
decide what the whole of human consciousness is.''
    Computer scientist Dr. O'Rourke, meanwhile, feels much more hopeful
about the technological prospect: ''I don't view technology as good
or bad or anything. It's just a further exploring, in the same way
that you try to explore some question of free will as a philosopher,
or as a theologian, or you go to a new planet, or you build a bridge.
I see all of this as exploration, and I find it enormously exciting.''
    Should technology be regarded as just another harmless form of
''pure science,'' as Dr. O'Rourke suggests?
    Or is there a fundamental difference between pure scientific
research, which - at least in theory - makes its discoveries
accidentally, disinterestedly, and ''applied'' technological research
of the kind which produced, along with many useful tools for mankind,
the hydrogen bomb?
    More than a few modern thinkers have defined these sorts of
questions as pivotal to our age; they suggest that our collective
fate may well lie in the answers - conscious or otherwise - that we
find for them.
    But before we can move in to take a closer look at how the scientist
and the theologian deal with these perplexing difficulties, we need
to say something more about how the revolution in computer technology
has been changing our world in recent years.
    
    The Background
    
    Although primitive calculating machines had been constructed by
French thinker Blaise Pascal in the 17th century and by English
inventor Charles Babbage in the 19th, the world's first electronic
data-processing computers were assembled by two scientists associated
with the Moore School of the University of Pennsylvania, about 35
years ago.
    Developed by J. Presper Eckert and John W. Mauchly, the original
Electronic Numerical Integrator and Computer (ENIAC), generally
considered to be the ''granddaddy'' of today's sleeker, smaller
machines, weighed 30 tons, took up 1,800 square feet of floor space,
needed some 18,000 vacuum tubes and drank electricity like a winded
racehorse at a water trough.
    The unwieldiness and inefficiency of these early machines can be
gauged from a reporter's 1951 description of one of the first models
as ''a contraption that looks like a combination pipe organ console,
a linotype machine and a telephone switchboard - with all of it
hooked up to a Buck Rogers typewriter.''
    (MORE)
    
nyt-12-26-81 0403est
***************

n530  0115  26 Dec 81
BC-COMPUTE-6takes-12-26
    
    By Tom Nugent
    (c) 1981, The Baltimore Sun (Field News Service)
    
    The Problem
    
    If you ask Dr. Joseph O'Rourke, a computer scientist, to describe
the impact of rapidly accelerating computer technology on modern
life, he will tell you these sorts of things:
    ''I'm a scientist. I believe in a mechanistic explanation for human
life. I feel that humans have free will, consciousness - but I also
believe that these things derive entirely from our neurons, our
brain, our body. There's no other explanation, no outside explanation.
    ''If a human can have free will strictly out of mechanistic reasons,
why can't a computer? How would it be different? I believe that free
will, consciousness and many of these very vague, mentalistic things
are properties that emerge from tremendous complexity. Computers are
nowhere near the state of having a consciousness now. But I see no
reason why they could not approach that. And at some point we may
have to say, 'Yes, this computer is conscious, there's no other
reasonable explanation for what it's doing.'
    ''Does a computer have a soul? Not today. But I see no reason why it
couldn't in the future.''
    That's Dr. Joseph O'Rourke, the Johns Hopkins University professor
of computer science, and that's one kind of assessment.
    But if you ask Dr. Richard Kalter, a Harvard-trained, Episcopal
theologian, to talk about the impact of computer technology on modern
life, you'll hear some other things:
    ''I don't think that in itself technology is a good thing or a bad
thing; I think it is both. But I think that above all, right now, we
need to understand what is propelling that movement along, and what
are the real issues involved, and what we are trying to solve with
it. The problem of human frailty? That a computer can supposedly act
with no mistakes, can be clear and precise; that it can handle
things, complications that a culture is no longer able to handle? It
has a theological significance, then, in the sense that it is a kind
of salvation for a complicated culture that can no longer hope to
handle itself.
    ''On the other hand, what some are now afraid of is that this same
technology is going to reduce the need for reflection, for insight,
for wisdom, for the kinds of leisured activities that make human
intercourse not only possible, but necessary. And if a robot-thing
now can be built to do something that a human hand does, is it also
possible to say that we can construct our technology in such a way
that there is no longer the need for us to sit here and ponder any
longer? That things like reflection, imagination ... that these
things no longer have any necessary relation to the actual affairs of
men?...
    ''I sometimes think we are in a real Dark Ages, and that there will
be a necessity for the Celtic monks to go underground for another 600
years to preserve beauty and imagination.''
    The computer scientist meets the theologian.
    It happened a few weeks ago, in a conference room at the Maryland
Art Institute in Baltimore, when an extraordinary new lecture series
entitled ''The Impact of Cultural Change on the Human Spirit'' began
with a wide-ranging exploration of the latest, ''state of the art''
developments in computer technology - and of the effect these
developments can be expected to have on the world in which they are
taking place.
    During the lecture portion of the program, Dr. O'Rourke showed
dozens of color slides demonstrating how modern thinking machines
have learned to draw and paint (the field is called ''computer
graphics'') better than most human beings can. His slides illustrated
how computers, commanded entirely by abstract, mathematically based
programs, can be taught to paint not only brightly colored
geometrical designs, but even gorgeous still-lifes - a bowl of
oranges, a leafy tree, a sunlit seacoast with the waves crashing in.
    The Maryland Institute students, many of whom spend their school
days learning how to draw and paint, were thoroughly impressed.
    They were also struck by Dr. O'Rourke's descriptions of two other
state-of-the-art developments: ''computer vision,'' in which the
machines analyze complex photographic data in order to ''see'' what
is around them (a skill that will prove especially useful to mobile
robots) and ''artificial intelligence,'' in which computers can be
programmed to do everything from playing chess to making decisions on
a manufacturing assembly line to designing and building other
computers.
    Indeed, suggested Dr. O'Rourke, the day may not be far off when
computers will figure out how to write novels and poems - and even to
begin asking the kinds of questions that have both delighted and
tormented human philosophers throughout the ages: What am I? Where
did I come from? Where am I going?
    The mood of the symposium seemed upbeat, exploratory, even playful.
''I like opening doors and seeing what's behind them,'' said
moderator Dr. Kalter, who taught ''moral and systematic theology'' at
Yale University for many years before signing on last fall as the
Institute's theologian-in-residence. ''As a matter of fact, I think
that one good way to describe theology is simply to call it the
discovery of openness.''
    For the most part, however, the evening belonged to the new
machines. As Dr. O'Rourke demonstrated again and again during his
lecture, the potential benefits of modern computer technology - in
communications, in manufacturing, in transportation, in
entertainment, in art - now seem much larger than our ability to
grasp them.
    (MORE)
    
nyt-12-26-81 0415est
***************

n533  0144  26 Dec 81
BC-COMPUTE-2dadd-12-26
    X X X BUCK ROGERS TYPEWRITER.''
    But the computer age had been launched. And during the next three
decades, as computer-builders made one quantum jump in design after
another, the machines would grow increasingly smaller, faster and
more energy-efficient.
    The key to their rapid evolution lay in their very essence; it lay
in the fact that computers operate on the binary number system, whose
two digits (one and zero) are represented by switches that can be
turned to either the on or the off position.
    The speed of the computer, then, depends, at least in part, on how
quickly you can open and close the switches: quicker switches mean
quicker computers.
    As the machines evolved, their designers went from primitive
mechanical switches (''relays'') to faster vacuum tubes, to even
faster transistors - and, finally, to today's ''integrated
circuits,'' which are actually platforms designed to carry entire
arrays of switches in combinations that accelerate the on-off process
enormously.
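    The switch idea described above can be sketched in a few lines of
modern code. This is purely an illustrative toy, not anything from the
machines the article describes: each "switch" is a Boolean, and a number
is just a row of them.

```python
# Toy illustration of the point above: every number inside a computer
# is a row of two-state switches. True = switch on, False = switch off.

def to_switches(n, width=8):
    """Represent the integer n as a list of on/off states, most significant first."""
    return [(n >> bit) & 1 == 1 for bit in reversed(range(width))]

def from_switches(switches):
    """Read a row of switch states back as an integer."""
    value = 0
    for on in switches:
        value = (value << 1) | (1 if on else 0)
    return value

row = to_switches(42)
print(row)                  # [False, False, True, False, True, False, True, False]
print(from_switches(row))   # 42
```

    Flipping those Booleans faster is, in miniature, the whole story of
the paragraph above: quicker switches mean quicker computers.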
    Along the evolutionary road, the primitive ENIAC shrank to the size
of today's average home computer, a device which weighs less than 30
pounds and uses about one-thousandth the electrical energy consumed
by the original electronic granddaddy.
    (The evolution continues today, of course. ''They're getting
amazingly complex,'' explains Dr. O'Rourke. ''In some cases, in fact,
they're butting up against fundamental limits - such as the speed of
light, or the speed of electricity in the wires. You can jam the
circuits closer and closer together, but then you start to get a
problem with heat. Probably the ultimate would be a computer of about
one cubic inch, which will be more powerful than the things that once
filled up an entire room, and it would be in a bath of liquid helium,
perhaps, so that it's super-cooled and superconductant. I understand
that IBM is already working toward this.'')
    As the machines grew smaller, faster and less expensive, of course,
they began to invade every phase of American life. A single statistic
demonstrates the extent of that invasion: in the late 1950s, as the
second-generation, transistor-based computers were just taking off,
Americans were spending about $1 billion annually on data processing
equipment; by 1980, they were spending more than $55 billion a year
on the same kinds of digital gear.
    (In 1980, jumbo-sized IBM earned 63 percent of the dollars spent on
general-purpose, American-made computers; seven other firms -
Honeywell, Sperry-Univac, Burroughs, NCR, Control Data, GE and RCA
(also known in the trade as ''The Seven Dwarfs'') divided the rest.)
    Today computers can be found at work in - and often in command of -
many areas of American life. Small microprocessors, for example,
control key components in washing-drying machines, in automobiles,
calculators, watches, stereos, traffic signals, and even children's
toys.
    Video games provide another example of the extent of the boom: only
a decade after the invention of ''Pong,'' the first computerized
television game, more than four million Americans now own the
programmable video display terminals on which the computer contests
are waged. And the video-computer apotheosis now appears to be in
sight: according to recent reports, the McDonald's hamburger chain
has begun negotiating with Atari for a video system, eventually to be
installed in every restaurant, in which a machine will first take
your meal order (you'll punch it in on the keyboard) - and then play
a videogame with you while you wait for your burgers and fries.
    In industry, the move to computer technology has been just as
pervasive and just as sudden. While General Motors and General
Electric continue to explore better techniques for computerizing
assembly lines and other manufacturing processes, Japanese ''robot
factories'' (in one of them, 10,000 computer-directed robots now work
24 hours a day, and without coffee breaks) are rapidly becoming a
commonplace in that country's efficiency-minded economy.
    The list of benefits provided by these new applications of computer
technology seems almost endless.
    But so does the list of problems which accompanies it.
    
    The Computer Scientist
    
    He sits with his feet up.
    This is a Monday afternoon on the campus of The Johns Hopkins
University, and Computer Science Professor Joe O'Rourke turns out to
be a very young-looking, 30-year-old man in an ordinary plaid
workshirt who wears big eyeglasses, trim slacks, and curly,
close-cropped hair. Dr. O'Rourke looks neat and organized and very
clean. But he also looks very relaxed, easygoing - not at all like
the stiff, rigid-faced zombie of the computer programmer cliches.
    ''It's all very hard to predict,'' says Joe O'Rourke, who attended
an eastern liberal arts college before earning a Ph.D. in computer
science at the University of Pennsylvania. He speaks softly, in a
high-pitched but genial voice; he laughs a great deal. He points out
that he's married to a historian; he also points out that his brother
is a dedicated sculptor - but a sculptor who these days often
executes his art works on the video display computer terminals that
Joe taught him how to use.
    (MORE)
    
nyt-12-26-81 0444est
***************

n534  0155  26 Dec 81
BC-COMPUTE-3dadd-12-26
    X X X HOW TO USE.
    Then the professor gets down to business: ''It's all very hard to
predict. I think the easy thing to predict would be to say that life
now under the impact of computer technology will become more oriented
toward numbers. You know, 'dehumanizing,' words like that.
    ''But I don't really think that will happen. And I think, just
looking at it - like the television games - you can see the
turnaround there. They're fun to play with! There's a lot of
interaction. Well, that's a computer. It's not the traditional idea
of a computer, where you say, 'You're a number, a Social Security
number,' that type of thing, which has been the common person's
interaction with computers until now.
    ''It's going to turn around, I think.'' He grins happily at the
prospect. ''You're going to find that they're a lot of fun. And this
will increase to the point where they'll be very useful, very
friendly - often more friendly than other people.
    ''But I do think there's much potential for damage. One thing that
worries me very much is the possibility of having computers take over
a lot of menial jobs without replacement of those jobs with other
work. I do not see the country properly planning for that, and unless
we move the work force toward other areas, I think it could cause a
tremendous social problem.''
    He measures you calmly, then: ''Soon they'll be able to write
newspaper articles better than newspaper reporters, and they'll be
able to do research better than me. What happens then?
    ''Still, I think it's very exciting. It's almost like planetary
exploration to me; it carries the same type of excitement.''
    Next question: does Dr. O'Rourke believe - as some scientists have
recently suggested - that computers may actually represent the next
stage in human evolution? What happens, for example, when they begin
to do, not just some things, but most things better than we can?
    He shakes his head. ''I don't think they're the next stage, no.
However, I do say that computers may become a species of their own,
in some sense. But they will not be the next step on the evolutionary
ladder. They would have their own niche. I think it would overlap
with ours, but it would not supplant us in any way.
    ''I think that what computers will be good at will be different from
what humans are good at. We can live side by side. If they write
novels, for example, we may not be interested in them. If they prove
theorems, we may find them absurdly complex, no elegance to them. If
they write poems, they may be riddled with puns to the point where we
find them ridiculous.
    ''And so, even though they will have their own culture, it will be
different from our culture. They will not have human values. They
will not have to find a date for the high school prom, for example -
you know, the things that really affect your life in so many ways.''
    You stare at him. Novels? Poems? How could a computer write a novel?
You create a novel, don't you? Would it actually be possible to
calculate one, using the kind of logic that computers use?
    ''I think you could,'' the doctor calmly suggests. ''Some computer
scientists might disagree here, but I definitely believe it can be
done. What they do is, they simply build randomness into the various
choices the machine makes. They use what are called 'pseudo-random
number generators.' And although these things must repeat themselves
eventually, they only repeat themselves over a period that is larger
than the age of the universe. And so for all practical purposes, they
are, in fact, random.''
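    The mechanism Dr. O'Rourke describes can be sketched with a seeded
pseudo-random number generator from Python's standard library. The word
lists and the seed below are invented here purely for illustration; the
point is only that every "creative" choice is calculated.

```python
import random

# Sketch of calculated "fiction": a seeded pseudo-random number generator
# makes every narrative choice, so the text is deterministic yet reads as
# unpredictable. All the vocabulary below is invented for this example.
subjects = ["The professor", "A machine", "The theologian"]
verbs = ["pondered", "questioned", "dismantled"]
objects = ["the argument", "its own circuits", "an old dream"]

rng = random.Random(1981)  # fixed seed: the same "novel" on every run

def sentence():
    return f"{rng.choice(subjects)} {rng.choice(verbs)} {rng.choice(objects)}."

for _ in range(3):
    print(sentence())
```

    Because the seed is fixed, rerunning the program reproduces the same
sentences exactly, which is what makes the output "pseudo"-random rather
than random in any deeper sense.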
    Professor O'Rourke says that he's not particularly bothered by the
prospect of computers someday living their own lives, writing their
own novels, constructing their own philosophies: ''I simply view
technology as further exploration of the world. I mean, it's true
that computers are our own creation. And yet they're made up of the
stuff of the world. And you can view it as an exploration of some of
the far-off consequences of the way matter is put together, the way
electrons get pushed around. And so we're taking the stuff of the
world, we're forming it into something, and we're exploring it.
    ''As a scientist, I have to believe that any dehumanization that
takes place along the way is independent of the technology. That we
may make the mistake of using it against ourselves sometimes, that's
all. And that it doesn't have to be done this way.
    ''The technology isn't really the problem - it's us.''
    
    The Theologian
    
    He refers to himself, whenever he talks theology, simply as an
''explorer of openness.''
    For this interview, on a bitterly cold Friday afternoon in early
December, we're sitting in a small neighborhood restaurant near his
downtown Baltimore apartment building. This is Dr. Richard Kalter,
who studied theology and philosophy at Harvard and then (under both
Reinhold Niebuhr and Paul Tillich) at the Union Theological Seminary
- and who years later resigned his teaching post at Yale University
to come to Baltimore to share his insights with undergraduate art
students at the Maryland Institute.
    (MORE)
    
nyt-12-26-81 0455est
***************

n535  0206  26 Dec 81
BC-COMPUTE-4thadd-12-26
    X X X THE MARYLAND INSTITUTE.
    These days, he says, he's more interested in what he doesn't know
than what he does.
    ''I'm drawn,'' he had told the Art Institute crowd during the
computer symposium, ''to the positive qualities of unknownness. I
think the real power that produces new understanding is always on the
side of the unknown. It's the known that finally does us in - what we
think we know.''
    Now, between bites of his noontime sandwich, this slender,
softspoken man with the thinning gray hair and the perpetual squint
of the professional thinker, explains first that ''my ignorance of
this field is overwhelming.''
    But yes, he does have some feelings about how these new
technological forces - at times he refers to them as ''our
obsession'' - are affecting all of us now.
    ''What I do think,'' he says after a long introductory pause, ''is
that technology today, before it represents an end to anything, is
simply the honest confirmation and conclusion of many trends that
have been at work for a long time in the development of human
consciousness and its ability. Technology has grown and developed and
become more sophisticated as a result of many factors that we somehow
forget have always been there...
    ''I think that if technology is an outgrowth, maybe even an
obsessive outgrowth of certain trends, then the question for me is:
How will we deal with our obsessiveness? How are we going to use that
energy? I think these are the same kinds of questions that people
asked the pure scientists about atomic energy - and the answer was
that we misused them. And I would say that that's what we're doing
here, yes.''
    Like Martin Heidegger, who said that philosophy had ''died'' with
the advent of modern technology, Dr. Kalter worries that the new
machines might ultimately strip man of his spiritual depth. ''I'm
rather pessimistic, to tell you the truth. I'm not pessimistic about
pure science, about the wonder of discovery, not at all. But you ask
me: Are these forces demonic?
    ''Well, I think we've been fighting that since Descartes. I mean the
fact that the bureaucratic mind takes over - we had that long before
computers, you know. What I am trying to say is that technology is
demonic only because it culminates in a kind of intensity, a focus
for what we have been playing around with for a very long time - the
subject-object split.''
    He pauses, then shakes his head in puzzlement. ''In some ways, yes,
I do think that these computer people are a species of zombie. I
really do. I don't know if you've ever been in a room with these
white-coated people. The only thing they can really say for
themselves that's human, is that they have a high percentage of
rapists among them. Because they're so denuded that when they get
outside the room, apparently, they do have a high degree of
sexuality. They are strange people because they stare at the little
machines all the time.''
    Does Dr. Kalter see a ''solution'' for what he describes as the
dangerously accelerating ''subject-object split'' now represented by
modern technology?
    ''To be Hegelian about this, I would point out that you always have
to have an antithesis, which will force, of itself, a new kind of
synthesis. It always does. And so, rather than saying that these
forces represent a dead end, you might say that they could be the key
to a new personalism - and thus a benefit.
    ''You know Hegel's saying: 'The owl of Minerva takes her flight at
eventide?' Wisdom only comes in the evening. I think it's a way of
saying that the breakthrough, reflective insight, comes to us only at
the end, often in an apocalyptic way. And I think Hegel in that
wonderful symbolism was really talking about the role of philosophy
as you try to understand historical movement.
    ''I wonder if the final battle will be the battle between the
'Imagination Man' and the 'Logic Man.' It's a pointless battle, of
course, because it's the same old split - but it may be an absolutely
necessary battle to get us to finally see what we've done to
ourselves in making these distortions. ... To decide what the whole
of human consciousness is, I think that is going to be the question
that's going to be fought out in an apocalyptic way here.
    ''And if those are the battle lines, then I think it behooves us all
to get into the battle and see. But the battle I don't want to fight,
a dumb battle - because I think it's really no battle at all - is the
battle over whether one side or the other should be destroyed, or
allowed to exist.''
    After more than an hour of talking, he's finally finished his
sandwich.
    Almost immediately, of course, he apologizes for his remarks.
    You must realize, he points out, that he understands almost nothing
about these incredibly complex problems.
    ''I guess I'm holding out for a little more clarity in my own mind.''
    
    The Solution: The Dream
    
    At the height of the effort to put together this difficult story,
your correspondent fell into troubled sleep. He dreamed the following
dream, the details of which are rendered exactly:
    (MORE)
    
nyt-12-26-81 0506est
***************

n536  0210  26 Dec 81
BC-COMPUTE-5thadd-12-26
    X X X ARE RENDERED EXACTLY:
    He was wandering down endless, tunnel-like passageways, and looking
into one face after another for something human.
    But the faces were cold. They seemed to be made of metal, or perhaps
of stone.
    The hands moved slowly, slowly, like hands of metal or stone.
    Their expressions were indifferent. He could not reach them. And the
terror mounted, as he realized that they were indifferent not because
of malice, or hatred, or even because of preoccupation with other
things.
    But simply because they were machines.
    Then, suddenly, he was lifted into the air.
    Rescued! It was one of those huge, World War I vintage biplanes, a
shuddering, lurching, staggering, airborne device. It was crazy, but
it worked. It was flying. And it was flying him away from the cold
machines, away to safety and warmth at last. . . .
    Still, he didn't trust this strange new device. Would it remain
airborne long enough to save him? He turned to ask the others about
this weird flight they were all taking . . . and then he heard them
singing.
    They were all children.
    They were tiny infants, babies of two and three, hanging from every
strut, every wing and fin and shining surface. Babies everywhere, and
all of them singing:
    ''The children's aircraft!''
    They sang the same words over and over again:
    ''The children's aircraft!''
    Their voices hummed and pinged and shimmered in a thousand different
mysterious keys:
    ''The children's aircraft!''
    And together they flew off to another kingdom - to the place that no
one can ever remember.
    END
    
nyt-12-26-81 0510est
***************